
feat: Add AmortizedVIPosterior for amortized variational inference#1751

Open
janfb wants to merge 21 commits into main from
add-amortized-vip

Conversation


@janfb janfb commented Feb 2, 2026

Summary

This PR implements amortized variational inference via a new AmortizedVIPosterior class, addressing #909.

Note on AI usage: I used Claude Code to help implement this. I went through many iterations and careful reviews, both on my own and with a very critical Codex 5.2 reviewer.

Context

The existing VIPosterior trains an unconditional variational distribution q(θ) for a fixed observation x_o. This requires retraining for every new observation, which is inefficient in scenarios requiring inference across many observations.

Amortized VI addresses this by learning a conditional distribution q(θ|x) that generalizes across observations. Once trained on simulation data (θ, x), the posterior can provide instant samples for any new x without retraining.

As part of this PR, we also align MAP estimation with the base posterior logic (potential-based) and keep sampling output shapes consistent across posteriors.

Implementation

We introduce AmortizedVIPosterior, which trains a conditional normalizing flow q(θ|x) by optimizing the ELBO against a potential function from NLE/NRE:

from sbi.inference import AmortizedVIPosterior, ZukoFlowType

posterior = AmortizedVIPosterior(
    potential_fn=potential_fn,
    prior=prior,
    flow_type=ZukoFlowType.NSF,
)

posterior.train(theta, x, max_num_iters=1000)

# Works for any observation without retraining
samples = posterior.sample((1000,), x=x_new)
samples_batch = posterior.sample_batched((1000,), x=x_batch)

A ZukoFlowType enum provides type-safe selection of flow architectures (NSF, MAF, NAF, UNAF, SOSPF, NICE, GF, NCSF, BPF).
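For intuition on the training objective, the reverse-KL ELBO maximized during training is E_{θ~q(θ|x)}[log p(θ, x) − log q(θ|x)]. Below is a minimal pure-Python sketch of this objective on a 1-D conjugate-Gaussian toy problem — illustrative only, not sbi code, and all function names here are made up for the example. When q matches the analytic posterior, the bound is tight and equals the log evidence:

```python
import math
import random


def log_normal(z, mu, var):
    # Log-density of N(z; mu, var).
    return -0.5 * (math.log(2 * math.pi * var) + (z - mu) ** 2 / var)


def elbo_estimate(x, q_mu, q_var, num_samples=1000, seed=0):
    """Monte Carlo ELBO: E_{theta ~ q(.|x)}[log p(theta, x) - log q(theta|x)].

    Toy model: prior theta ~ N(0, 1), likelihood x | theta ~ N(theta, 1),
    variational family q(theta | x) = N(q_mu, q_var).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        theta = rng.gauss(q_mu, math.sqrt(q_var))
        log_joint = log_normal(theta, 0.0, 1.0) + log_normal(x, theta, 1.0)
        total += log_joint - log_normal(theta, q_mu, q_var)
    return total / num_samples


x_o = 1.0
# With q set to the analytic posterior N(x_o / 2, 1/2), the bound is tight:
# every sample yields exactly log p(x_o) = log N(x_o; 0, 2).
tight = elbo_estimate(x_o, x_o / 2, 0.5)
```

Amortization then amounts to making `q_mu` and `q_var` outputs of a network conditioned on `x` (a conditional flow in the actual implementation), so one training run covers all observations.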

Design Choices

Separate class vs extending VIPosterior

We created a new class rather than adding an amortized mode to VIPosterior:

  • Clear separation of concerns (unconditional vs conditional flows)
  • Different training signatures (train() vs train(theta, x))
  • Avoids conditional logic complexity in VIPosterior

ZukoFlowType enum scoping

The ZukoFlowType enum is defined in sbi.neural_nets.factory and reused here:

  • Covers Zuko flows with efficient log_prob (NSF, MAF, NAF, UNAF, SOSPF, NICE, GF, NCSF, BPF)
  • Keeps a shared, Zuko-specific enum for other builders

Potential function interface

The implementation uses the existing potential_fn.set_x() pattern for efficiency. This makes the class non-thread-safe, which is documented in the class docstring.
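To illustrate why the `set_x()` pattern trades thread safety for efficiency, here is a toy stand-in (not the actual sbi class; `ToyPotential` is invented for this sketch) that caches the conditioning observation as mutable state:

```python
import math


class ToyPotential:
    """Toy stand-in for the potential_fn.set_x() pattern (illustrative only).

    The observation x is cached as mutable state, so repeated evaluations at
    the same x avoid re-passing it. The flip side: two threads sharing one
    instance can overwrite each other's x, hence the non-thread-safety note.
    """

    def __init__(self):
        self._x = None

    def set_x(self, x: float) -> None:
        # Cache the conditioning observation for subsequent calls.
        self._x = x

    def __call__(self, theta: float) -> float:
        if self._x is None:
            raise RuntimeError("Call set_x(x) before evaluating the potential.")
        # Toy log-potential: Gaussian likelihood N(x; theta, 1), flat prior.
        return -0.5 * math.log(2 * math.pi) - 0.5 * (self._x - theta) ** 2


potential = ToyPotential()
potential.set_x(1.0)
value_at_mode = potential(1.0)  # maximal at theta == x
```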

Testing

We validate correctness using a linear Gaussian problem where the true posterior is
analytically known. Tests verify:

  • Accuracy via C2ST against ground truth samples
  • Comparison with standard VIPosterior on the same problem
  • Gradient flow through the ELBO to all flow parameters
  • Correct error handling for edge cases (missing x, untrained model)
  • MAP estimation returns high-density regions
  • Validation batching options to control ELBO evaluation cost
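The C2ST check in the first bullet can be sketched in isolation: train a classifier to distinguish posterior samples from ground-truth samples; accuracy near 0.5 means the two sets are indistinguishable, accuracy near 1.0 means the approximation is off. A minimal leave-one-out 1-nearest-neighbour variant for 1-D samples (illustrative only, not sbi's actual C2ST implementation):

```python
import random


def c2st_1nn(samples_p, samples_q):
    """Leave-one-out 1-NN classifier two-sample test on 1-D samples.

    Returns classification accuracy: ~0.5 if the two sample sets are
    indistinguishable, approaching 1.0 if they are clearly separated.
    """
    labeled = [(s, 0) for s in samples_p] + [(s, 1) for s in samples_q]
    correct = 0
    for i, (s, label) in enumerate(labeled):
        # The nearest neighbour among all other points votes for the label.
        others = labeled[:i] + labeled[i + 1:]
        _, predicted = min(others, key=lambda t: abs(t[0] - s))
        correct += predicted == label
    return correct / len(labeled)


rng = random.Random(0)
close = c2st_1nn(
    [rng.gauss(0, 1) for _ in range(100)],
    [rng.gauss(0, 1) for _ in range(100)],
)  # same distribution: accuracy near chance level
far = c2st_1nn(
    [rng.gauss(0, 1) for _ in range(100)],
    [rng.gauss(10, 1) for _ in range(100)],
)  # well-separated distributions: accuracy near 1.0
```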

Closes #909

janfb and others added 6 commits January 13, 2026 16:45
…r; update references in implementation and tests
- Reuse Zuko flow enum,
- align MAP with potential-based logic,
- tighten sampling/validation behavior while updating tests and docs.
- Fix VIPosterior.to() to return self for method chaining (matches AmortizedVIPosterior)
- Add theta dimension validation in AmortizedVIPosterior.train() to catch mismatches early
- Remove ZukoFlowType from top-level sbi.inference exports (still available via sbi.inference.posteriors)
- Update test imports accordingly

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@janfb janfb self-assigned this Feb 2, 2026

codecov bot commented Feb 2, 2026

❌ 15 Tests Failed:

Tests completed: 5837 | Failed: 15 | Passed: 5822 | Skipped: 146
View the top 3 failed test(s) by shortest run time
tests/mnle_test.py::test_mnle_api[none-mdn-vi]
Stack Traces | 0.048s run time
flow_model = 'mdn', sampler = 'vi'
mcmc_params_fast = MCMCPosteriorParameters(method='slice_np_vectorized', thin=1, warmup_steps=1, num_chains=1, init_strategy='resample', init_strategy_parameters=None, num_workers=1, mp_context='spawn')
z_score_theta = 'none'

    @pytest.mark.parametrize(
        "sampler", (pytest.param("mcmc", marks=pytest.mark.mcmc), "rejection", "vi")
    )
    @pytest.mark.parametrize("flow_model", ("mdn", "nsf", "zuko_nsf"))
    @pytest.mark.parametrize("z_score_theta", ("independent", "none"))
    def test_mnle_api(
        flow_model: str,
        sampler,
        mcmc_params_fast: MCMCPosteriorParameters,
        z_score_theta: str,
    ):
        """Test MNLE API."""
        # Generate mixed data.
        num_simulations = 10
        theta = torch.rand(num_simulations, 2)
        x = torch.cat(
            (
                torch.rand(num_simulations, 1),
                torch.randint(0, 2, (num_simulations, 1)),
            ),
            dim=1,
        )
    
        # Train and infer.
        prior = BoxUniform(torch.zeros(2), torch.ones(2))
        x_o = x[0]
        # Build estimator manually.
        theta_embedding = FCEmbedding(2, 2)  # simple embedding net
        density_estimator = likelihood_nn(
            model="mnle",
            flow_model=flow_model,
            z_score_theta=z_score_theta,
            embedding_net=theta_embedding,
        )
        trainer = MNLE(density_estimator=density_estimator)
        trainer.append_simulations(theta, x).train(max_num_epochs=1)
    
        # Test different samplers.
>       posterior = trainer.build_posterior(prior=prior, sample_with=sampler)

tests/mnle_test.py:132: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.../trainers/nle/mnle.py:176: in build_posterior
    return super().build_posterior(
.../trainers/nle/nle_base.py:291: in build_posterior
    return super().build_posterior(
.../inference/trainers/base.py:507: in build_posterior
    self._posterior = self._create_posterior(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <sbi.inference.trainers.nle.mnle.MNLE object at 0x7f8ccfd896f0>
estimator = MixedDensityEstimator(
  (net): NFlowsFlow(
    (net): Flow(
      (_transform): CompositeTransform(
        (_transfo...bias=True)
      (1): ReLU()
      (2): Linear(in_features=50, out_features=2, bias=True)
      (3): ReLU()
    )
  )
)
prior = BoxUniform(Uniform(low: torch.Size([2]), high: torch.Size([2])), 1)
sample_with = 'vi', device = 'cpu'
posterior_parameters = VIPosteriorParameters(q='maf', vi_method='rKL', num_transforms=5, hidden_features=50, z_score_theta='independent', z_score_x='independent')

    def _create_posterior(
        self,
        estimator: ConditionalEstimator,
        prior: Distribution,
        sample_with: Literal[
            "mcmc", "rejection", "vi", "importance", "direct", "sde", "ode"
        ],
        device: Union[str, torch.device],
        posterior_parameters: PosteriorParameters,
    ) -> NeuralPosterior:
        """
        Create a posterior object using the specified inference method.
    
        Depending on the value of `sample_with`, this method instantiates one of the
        supported posterior inference strategies.
    
        Args:
            estimator: The estimator that the posterior is based on.
            prior: A probability distribution that expresses prior knowledge about the
                parameters, e.g. which ranges are meaningful for them. Must be a PyTorch
                distribution, see FAQ for details on how to use custom distributions.
            sample_with: The inference method to use. Must be one of:
                - "mcmc"
                - "rejection"
                - "vi"
                - "importance"
                - "direct"
                - "sde"
                - "ode"
            device: torch device on which to train the neural net and on which to
                perform all posterior operations, e.g. gpu or cpu.
            posterior_parameters: Configuration passed to the init method for the
                posterior. Must be of type PosteriorParameters.
    
        Returns:
            NeuralPosterior object.
        """
    
        if isinstance(posterior_parameters, DirectPosteriorParameters):
            posterior_estimator = estimator
            if not isinstance(posterior_estimator, ConditionalDensityEstimator):
                raise TypeError(
                    f"Expected posterior_estimator to be an instance of "
                    " ConditionalDensityEstimator, "
                    f"but got {type(posterior_estimator).__name__} instead."
                )
            posterior = DirectPosterior(
                posterior_estimator=posterior_estimator,
                prior=prior,
                device=device,
                **asdict(posterior_parameters),
            )
        elif isinstance(posterior_parameters, VectorFieldPosteriorParameters):
            vector_field_estimator = estimator
            if not isinstance(vector_field_estimator, ConditionalVectorFieldEstimator):
                raise TypeError(
                    f"Expected vector_field_estimator to be an instance of "
                    " ConditionalVectorFieldEstimator, "
                    f"but got {type(vector_field_estimator).__name__} instead."
                )
            if sample_with not in ("ode", "sde"):
                raise ValueError(
                    "`sample_with` must be either",
                    f" 'ode' or 'sde', got '{sample_with}'",
                )
            posterior = VectorFieldPosterior(
                vector_field_estimator=vector_field_estimator,
                prior=prior,
                device=device,
                sample_with=sample_with,
                **asdict(posterior_parameters),
            )
        else:
            # Posteriors requiring potential_fn and theta_transform
            potential_fn, theta_transform = self._get_potential_function(
                prior, estimator
            )
            if isinstance(posterior_parameters, MCMCPosteriorParameters):
                posterior = MCMCPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    proposal=prior,
                    device=device,
                    **asdict(posterior_parameters),
                )
            elif isinstance(posterior_parameters, RejectionPosteriorParameters):
                posterior = RejectionPosterior(
                    potential_fn=potential_fn,
                    proposal=prior,
                    device=device,
                    **asdict(posterior_parameters),
                )
            elif isinstance(posterior_parameters, VIPosteriorParameters):
>               posterior = VIPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    prior=prior,
                    device=device,
                    **asdict(posterior_parameters),
E                   TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms'

.../inference/trainers/base.py:899: TypeError
tests/mnle_test.py::test_mnle_api[independent-mdn-vi]
Stack Traces | 0.05s run time
(identical stack trace: TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms')
tests/mnle_test.py::test_mnle_api[none-nsf-vi]
Stack Traces | 0.096s run time
(identical stack trace: TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms')
tests/mnle_test.py::test_mnle_api[independent-nsf-vi]
Stack Traces | 0.097s run time
flow_model = 'nsf', sampler = 'vi'
mcmc_params_fast = MCMCPosteriorParameters(method='slice_np_vectorized', thin=1, warmup_steps=1, num_chains=1, init_strategy='resample', init_strategy_parameters=None, num_workers=1, mp_context='spawn')
z_score_theta = 'independent'

    @pytest.mark.parametrize(
        "sampler", (pytest.param("mcmc", marks=pytest.mark.mcmc), "rejection", "vi")
    )
    @pytest.mark.parametrize("flow_model", ("mdn", "nsf", "zuko_nsf"))
    @pytest.mark.parametrize("z_score_theta", ("independent", "none"))
    def test_mnle_api(
        flow_model: str,
        sampler,
        mcmc_params_fast: MCMCPosteriorParameters,
        z_score_theta: str,
    ):
        """Test MNLE API."""
        # Generate mixed data.
        num_simulations = 10
        theta = torch.rand(num_simulations, 2)
        x = torch.cat(
            (
                torch.rand(num_simulations, 1),
                torch.randint(0, 2, (num_simulations, 1)),
            ),
            dim=1,
        )
    
        # Train and infer.
        prior = BoxUniform(torch.zeros(2), torch.ones(2))
        x_o = x[0]
        # Build estimator manually.
        theta_embedding = FCEmbedding(2, 2)  # simple embedding net
        density_estimator = likelihood_nn(
            model="mnle",
            flow_model=flow_model,
            z_score_theta=z_score_theta,
            embedding_net=theta_embedding,
        )
        trainer = MNLE(density_estimator=density_estimator)
        trainer.append_simulations(theta, x).train(max_num_epochs=1)
    
        # Test different samplers.
>       posterior = trainer.build_posterior(prior=prior, sample_with=sampler)

tests/mnle_test.py:132: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.../trainers/nle/mnle.py:176: in build_posterior
    return super().build_posterior(
.../trainers/nle/nle_base.py:291: in build_posterior
    return super().build_posterior(
.../inference/trainers/base.py:507: in build_posterior
    self._posterior = self._create_posterior(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <sbi.inference.trainers.nle.mnle.MNLE object at 0x7f8ccfdbba00>
estimator = MixedDensityEstimator(
  (net): NFlowsFlow(
    (net): Flow(
      (_transform): CompositeTransform(
        (_transfo...     (1): ReLU()
        (2): Linear(in_features=50, out_features=2, bias=True)
        (3): ReLU()
      )
    )
  )
)
prior = BoxUniform(Uniform(low: torch.Size([2]), high: torch.Size([2])), 1)
sample_with = 'vi', device = 'cpu'
posterior_parameters = VIPosteriorParameters(q='maf', vi_method='rKL', num_transforms=5, hidden_features=50, z_score_theta='independent', z_score_x='independent')

    def _create_posterior(
        self,
        estimator: ConditionalEstimator,
        prior: Distribution,
        sample_with: Literal[
            "mcmc", "rejection", "vi", "importance", "direct", "sde", "ode"
        ],
        device: Union[str, torch.device],
        posterior_parameters: PosteriorParameters,
    ) -> NeuralPosterior:
        """
        Create a posterior object using the specified inference method.
    
        Depending on the value of `sample_with`, this method instantiates one of the
        supported posterior inference strategies.
    
        Args:
            estimator: The estimator that the posterior is based on.
            prior: A probability distribution that expresses prior knowledge about the
                parameters, e.g. which ranges are meaningful for them. Must be a PyTorch
                distribution, see FAQ for details on how to use custom distributions.
            sample_with: The inference method to use. Must be one of:
                - "mcmc"
                - "rejection"
                - "vi"
                - "importance"
                - "direct"
                - "sde"
                - "ode"
            device: torch device on which to train the neural net and on which to
                perform all posterior operations, e.g. gpu or cpu.
            posterior_parameters: Configuration passed to the init method for the
                posterior. Must be of type PosteriorParameters.
    
        Returns:
            NeuralPosterior object.
        """
    
        if isinstance(posterior_parameters, DirectPosteriorParameters):
            posterior_estimator = estimator
            if not isinstance(posterior_estimator, ConditionalDensityEstimator):
                raise TypeError(
                    f"Expected posterior_estimator to be an instance of "
                    " ConditionalDensityEstimator, "
                    f"but got {type(posterior_estimator).__name__} instead."
                )
            posterior = DirectPosterior(
                posterior_estimator=posterior_estimator,
                prior=prior,
                device=device,
                **asdict(posterior_parameters),
            )
        elif isinstance(posterior_parameters, VectorFieldPosteriorParameters):
            vector_field_estimator = estimator
            if not isinstance(vector_field_estimator, ConditionalVectorFieldEstimator):
                raise TypeError(
                    f"Expected vector_field_estimator to be an instance of "
                    " ConditionalVectorFieldEstimator, "
                    f"but got {type(vector_field_estimator).__name__} instead."
                )
            if sample_with not in ("ode", "sde"):
                raise ValueError(
                    "`sample_with` must be either",
                    f" 'ode' or 'sde', got '{sample_with}'",
                )
            posterior = VectorFieldPosterior(
                vector_field_estimator=vector_field_estimator,
                prior=prior,
                device=device,
                sample_with=sample_with,
                **asdict(posterior_parameters),
            )
        else:
            # Posteriors requiring potential_fn and theta_transform
            potential_fn, theta_transform = self._get_potential_function(
                prior, estimator
            )
            if isinstance(posterior_parameters, MCMCPosteriorParameters):
                posterior = MCMCPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    proposal=prior,
                    device=device,
                    **asdict(posterior_parameters),
                )
            elif isinstance(posterior_parameters, RejectionPosteriorParameters):
                posterior = RejectionPosterior(
                    potential_fn=potential_fn,
                    proposal=prior,
                    device=device,
                    **asdict(posterior_parameters),
                )
            elif isinstance(posterior_parameters, VIPosteriorParameters):
>               posterior = VIPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    prior=prior,
                    device=device,
                    **asdict(posterior_parameters),
E                   TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms'

.../inference/trainers/base.py:899: TypeError
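The failure mode is that `VIPosteriorParameters` carries fields (`num_transforms`, `hidden_features`, `z_score_theta`, `z_score_x`) that `VIPosterior.__init__` does not accept, so `**asdict(posterior_parameters)` raises. The stubs below are hypothetical stand-ins (not the actual sbi classes) that reproduce the error, followed by one *possible* fix — filtering the dataclass fields against the constructor signature — which is an assumption for illustration, not necessarily how this PR resolves it:

```python
import inspect
from dataclasses import dataclass, asdict


# Hypothetical stand-ins for VIPosteriorParameters / VIPosterior,
# reduced to the fields needed to reproduce the TypeError.
@dataclass
class VIParams:
    q: str = "maf"
    vi_method: str = "rKL"
    num_transforms: int = 5  # not an argument of the stub's __init__


class VIPosteriorStub:
    def __init__(self, q: str = "maf", vi_method: str = "rKL"):
        self.q = q
        self.vi_method = vi_method


params = VIParams()

# Unpacking the full dataclass reproduces the failure:
try:
    VIPosteriorStub(**asdict(params))
except TypeError as err:
    print(err)  # ... got an unexpected keyword argument 'num_transforms'

# One possible fix (an assumption, not the PR's actual solution):
# keep only the fields that the target constructor accepts.
accepted = set(inspect.signature(VIPosteriorStub.__init__).parameters) - {"self"}
kwargs = {k: v for k, v in asdict(params).items() if k in accepted}
posterior = VIPosteriorStub(**kwargs)
```

Note that the `test_signature_consistency` test further down checks exactly this kind of dataclass/constructor mismatch, so extra flow-construction fields would instead need to be consumed before the posterior is instantiated.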
tests/mnle_test.py::test_mnle_api[none-zuko_nsf-vi]
Stack Traces | 0.111s run time
flow_model = 'zuko_nsf', sampler = 'vi'
mcmc_params_fast = MCMCPosteriorParameters(method='slice_np_vectorized', thin=1, warmup_steps=1, num_chains=1, init_strategy='resample', init_strategy_parameters=None, num_workers=1, mp_context='spawn')
z_score_theta = 'none'

    @pytest.mark.parametrize(
        "sampler", (pytest.param("mcmc", marks=pytest.mark.mcmc), "rejection", "vi")
    )
    @pytest.mark.parametrize("flow_model", ("mdn", "nsf", "zuko_nsf"))
    @pytest.mark.parametrize("z_score_theta", ("independent", "none"))
    def test_mnle_api(
        flow_model: str,
        sampler,
        mcmc_params_fast: MCMCPosteriorParameters,
        z_score_theta: str,
    ):
        """Test MNLE API."""
        # Generate mixed data.
        num_simulations = 10
        theta = torch.rand(num_simulations, 2)
        x = torch.cat(
            (
                torch.rand(num_simulations, 1),
                torch.randint(0, 2, (num_simulations, 1)),
            ),
            dim=1,
        )
    
        # Train and infer.
        prior = BoxUniform(torch.zeros(2), torch.ones(2))
        x_o = x[0]
        # Build estimator manually.
        theta_embedding = FCEmbedding(2, 2)  # simple embedding net
        density_estimator = likelihood_nn(
            model="mnle",
            flow_model=flow_model,
            z_score_theta=z_score_theta,
            embedding_net=theta_embedding,
        )
        trainer = MNLE(density_estimator=density_estimator)
        trainer.append_simulations(theta, x).train(max_num_epochs=1)
    
        # Test different samplers.
>       posterior = trainer.build_posterior(prior=prior, sample_with=sampler)

tests/mnle_test.py:132: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.../trainers/nle/mnle.py:176: in build_posterior
    return super().build_posterior(
.../trainers/nle/nle_base.py:291: in build_posterior
    return super().build_posterior(
.../inference/trainers/base.py:507: in build_posterior
    self._posterior = self._create_posterior(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <sbi.inference.trainers.nle.mnle.MNLE object at 0x7f215ab33f40>
estimator = MixedDensityEstimator(
  (net): ZukoFlow(
    (net): Flow(
      (transform): LazyComposedTransform(
        (0): Unco...bias=True)
      (1): ReLU()
      (2): Linear(in_features=50, out_features=2, bias=True)
      (3): ReLU()
    )
  )
)
prior = BoxUniform(Uniform(low: torch.Size([2]), high: torch.Size([2])), 1)
sample_with = 'vi', device = 'cpu'
posterior_parameters = VIPosteriorParameters(q='maf', vi_method='rKL', num_transforms=5, hidden_features=50, z_score_theta='independent', z_score_x='independent')

    def _create_posterior(
        self,
        estimator: ConditionalEstimator,
        prior: Distribution,
        sample_with: Literal[
            "mcmc", "rejection", "vi", "importance", "direct", "sde", "ode"
        ],
        device: Union[str, torch.device],
        posterior_parameters: PosteriorParameters,
    ) -> NeuralPosterior:
        """
        Create a posterior object using the specified inference method.
    
        Depending on the value of `sample_with`, this method instantiates one of the
        supported posterior inference strategies.
    
        Args:
            estimator: The estimator that the posterior is based on.
            prior: A probability distribution that expresses prior knowledge about the
                parameters, e.g. which ranges are meaningful for them. Must be a PyTorch
                distribution, see FAQ for details on how to use custom distributions.
            sample_with: The inference method to use. Must be one of:
                - "mcmc"
                - "rejection"
                - "vi"
                - "importance"
                - "direct"
                - "sde"
                - "ode"
            device: torch device on which to train the neural net and on which to
                perform all posterior operations, e.g. gpu or cpu.
            posterior_parameters: Configuration passed to the init method for the
                posterior. Must be of type PosteriorParameters.
    
        Returns:
            NeuralPosterior object.
        """
    
        if isinstance(posterior_parameters, DirectPosteriorParameters):
            posterior_estimator = estimator
            if not isinstance(posterior_estimator, ConditionalDensityEstimator):
                raise TypeError(
                    f"Expected posterior_estimator to be an instance of "
                    " ConditionalDensityEstimator, "
                    f"but got {type(posterior_estimator).__name__} instead."
                )
            posterior = DirectPosterior(
                posterior_estimator=posterior_estimator,
                prior=prior,
                device=device,
                **asdict(posterior_parameters),
            )
        elif isinstance(posterior_parameters, VectorFieldPosteriorParameters):
            vector_field_estimator = estimator
            if not isinstance(vector_field_estimator, ConditionalVectorFieldEstimator):
                raise TypeError(
                    f"Expected vector_field_estimator to be an instance of "
                    " ConditionalVectorFieldEstimator, "
                    f"but got {type(vector_field_estimator).__name__} instead."
                )
            if sample_with not in ("ode", "sde"):
                raise ValueError(
                    "`sample_with` must be either",
                    f" 'ode' or 'sde', got '{sample_with}'",
                )
            posterior = VectorFieldPosterior(
                vector_field_estimator=vector_field_estimator,
                prior=prior,
                device=device,
                sample_with=sample_with,
                **asdict(posterior_parameters),
            )
        else:
            # Posteriors requiring potential_fn and theta_transform
            potential_fn, theta_transform = self._get_potential_function(
                prior, estimator
            )
            if isinstance(posterior_parameters, MCMCPosteriorParameters):
                posterior = MCMCPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    proposal=prior,
                    device=device,
                    **asdict(posterior_parameters),
                )
            elif isinstance(posterior_parameters, RejectionPosteriorParameters):
                posterior = RejectionPosterior(
                    potential_fn=potential_fn,
                    proposal=prior,
                    device=device,
                    **asdict(posterior_parameters),
                )
            elif isinstance(posterior_parameters, VIPosteriorParameters):
>               posterior = VIPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    prior=prior,
                    device=device,
                    **asdict(posterior_parameters),
E                   TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms'

.../inference/trainers/base.py:899: TypeError
tests/mnle_test.py::test_mnle_api[independent-zuko_nsf-vi]
Stack Traces | 0.117s run time
flow_model = 'zuko_nsf', sampler = 'vi'
mcmc_params_fast = MCMCPosteriorParameters(method='slice_np_vectorized', thin=1, warmup_steps=1, num_chains=1, init_strategy='resample', init_strategy_parameters=None, num_workers=1, mp_context='spawn')
z_score_theta = 'independent'

    (test body, locals, and _create_posterior traceback are identical to
    test_mnle_api[none-zuko_nsf-vi] above, up to the parametrization shown)
E                   TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms'

.../inference/trainers/base.py:899: TypeError
Flaky tests (9 total ❄️; full list truncated):
tests/posterior_parameters_test.py::test_build_posterior_warns_on_conflicting_args[build_posterior_arguments1]

Flake rate in main: 33.33% (Passed 132 times, Failed 66 times)

Stack Traces | 0.041s run time
build_posterior_arguments = {'posterior_parameters': VIPosteriorParameters(q='maf', vi_method='fKL', num_transforms=5, hidden_features=50, z_score_theta='independent', z_score_x='independent'), 'vi_method': 'IW'}
get_inference = <sbi.inference.trainers.nre.nre_b.NRE_B object at 0x7eff5968e200>

    @pytest.mark.parametrize(
        "build_posterior_arguments",
        [
            dict(
                mcmc_method="slice_pymc",
                posterior_parameters=MCMCPosteriorParameters(method="hmc_pyro"),
            ),
            dict(
                vi_method="IW",
                posterior_parameters=VIPosteriorParameters(vi_method="fKL"),
            ),
        ],
    )
    def test_build_posterior_warns_on_conflicting_args(
        build_posterior_arguments, get_inference
    ):
        """
        Test that build_posterior raises a UserWarning on conflicting parameter
        combinations.
        """
        inference = get_inference
    
        with pytest.warns(UserWarning, match="ignored in favor of"):
>           inference.build_posterior(**build_posterior_arguments)

tests/posterior_parameters_test.py:198: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.../trainers/nre/nre_base.py:360: in build_posterior
    return super().build_posterior(
.../inference/trainers/base.py:507: in build_posterior
    self._posterior = self._create_posterior(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <sbi.inference.trainers.nre.nre_b.NRE_B object at 0x7eff5968e200>
estimator = RatioEstimator(
  (net): ResidualNet(
    (initial_layer): Linear(in_features=6, out_features=50, bias=True)
    (bloc...Standardize()
    (1): Identity()
  )
  (embedding_net_x): Sequential(
    (0): Standardize()
    (1): Identity()
  )
)
prior = BoxUniform(Uniform(low: torch.Size([3]), high: torch.Size([3])), 1)
sample_with = 'mcmc', device = 'cpu'
posterior_parameters = VIPosteriorParameters(q='maf', vi_method='fKL', num_transforms=5, hidden_features=50, z_score_theta='independent', z_score_x='independent')

    (full _create_posterior source identical to the traceback above)
>               posterior = VIPosterior(
E                   TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms'

.../inference/trainers/base.py:899: TypeError
tests/posterior_parameters_test.py::test_build_posterior_works_on_default_args[build_posterior_arguments1]

Flake rate in main: 33.33% (Passed 132 times, Failed 66 times)

Stack Traces | 0.054s run time
build_posterior_arguments = {'posterior_parameters': VIPosteriorParameters(q='maf', vi_method='rKL', num_transforms=5, hidden_features=50, z_score_theta='independent', z_score_x='independent')}
get_inference = <sbi.inference.trainers.nre.nre_b.NRE_B object at 0x7f0fd8b693f0>

    @pytest.mark.parametrize(
        "build_posterior_arguments",
        [
            pytest.param(
                dict(
                    posterior_parameters=MCMCPosteriorParameters(
                        method="slice_np_vectorized"
                    ),
                ),
            ),
            pytest.param(
                dict(
                    posterior_parameters=VIPosteriorParameters(vi_method="rKL"),
                ),
            ),
        ],
    )
    def test_build_posterior_works_on_default_args(
        build_posterior_arguments, get_inference
    ):
        """
        Test that build_posterior doesn't raise on default parameters.
        """
    
        inference = get_inference
>       inference.build_posterior(**build_posterior_arguments)

tests/posterior_parameters_test.py:226: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.../trainers/nre/nre_base.py:360: in build_posterior
    return super().build_posterior(
.../inference/trainers/base.py:507: in build_posterior
    self._posterior = self._create_posterior(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <sbi.inference.trainers.nre.nre_b.NRE_B object at 0x7f0fd8b693f0>
estimator = RatioEstimator(
  (net): ResidualNet(
    (initial_layer): Linear(in_features=6, out_features=50, bias=True)
    (bloc...Standardize()
    (1): Identity()
  )
  (embedding_net_x): Sequential(
    (0): Standardize()
    (1): Identity()
  )
)
prior = BoxUniform(Uniform(low: torch.Size([3]), high: torch.Size([3])), 1)
sample_with = 'mcmc', device = 'cpu'
posterior_parameters = VIPosteriorParameters(q='maf', vi_method='rKL', num_transforms=5, hidden_features=50, z_score_theta='independent', z_score_x='independent')

    (full _create_posterior source identical to the traceback above)
>               posterior = VIPosterior(
E                   TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms'

.../inference/trainers/base.py:899: TypeError
tests/posterior_parameters_test.py::test_signature_consistency[VIPosteriorParameters-VIPosterior-skipped_fields_and_parameters4]

Flake rate in main: 33.33% (Passed 132 times, Failed 66 times)

Stack Traces | 0.004s run time
parameter_dataclass = <class 'sbi.inference.posteriors.posterior_parameters.VIPosteriorParameters'>
init_target_class = <class 'sbi.inference.posteriors.vi_posterior.VIPosterior'>
skipped_fields_and_parameters = {'device', 'potential_fn', 'prior', 'self', 'theta_transform', 'x_shape'}

    @pytest.mark.parametrize(
        ("parameter_dataclass", "init_target_class", "skipped_fields_and_parameters"),
        [
            (
                DirectPosteriorParameters,
                DirectPosterior,
                {"posterior_estimator", "prior", "device"},
            ),
            (
                ImportanceSamplingPosteriorParameters,
                ImportanceSamplingPosterior,
                {"potential_fn", "proposal", "device"},
            ),
            (
                MCMCPosteriorParameters,
                MCMCPosterior,
                {
                    "potential_fn",
                    "proposal",
                    "device",
                    "theta_transform",
                    "init_strategy_num_candidates",
                },
            ),
            (
                RejectionPosteriorParameters,
                RejectionPosterior,
                {"potential_fn", "device", "proposal"},
            ),
            (
                VIPosteriorParameters,
                VIPosterior,
                {"potential_fn", "prior", "theta_transform", "device"},
            ),
            (
                VectorFieldPosteriorParameters,
                VectorFieldPosterior,
                {
                    "vector_field_estimator",
                    "device",
                    "prior",
                    "sample_with",
                    "iid_method",
                    "iid_params",
                    "neural_ode_backend",
                    "neural_ode_kwargs",
                },
            ),
            (
                VectorFieldPosteriorParameters,
                VectorFieldBasedPotential,
                {
                    "vector_field_estimator",
                    "device",
                    "prior",
                    "x_o",
                    "enable_transform",
                    "max_sampling_batch_size",
                },
            ),
        ],
    )
    def test_signature_consistency(
        parameter_dataclass, init_target_class, skipped_fields_and_parameters
    ):
        """
        Test that the constructor (__init__) signature of a target class matches the
        signature of a corresponding parameter dataclass.
    
        This function compares the argument names, default values, and type annotations
        between the dataclass and the target class __init__ method, ignoring specified
        parameters passed in `skipped_fields_and_parameters`.
    
        Args:
            parameter_dataclass: The dataclass whose signature is used as reference.
            init_target_class: The class whose __init__ method signature is compared.
            skipped_fields_and_parameters (set): A set of parameter names to ignore during
                comparison (e.g., 'self', or fields not relevant for matching).
    
        Raises:
            AssertionError: If there is any mismatch in parameter names, default values,
                or type annotations between the dataclass and the class constructor.
        """
        dataclass_signature = inspect.signature(parameter_dataclass)
        class_signature = inspect.signature(init_target_class.__init__)
    
        skipped_fields_and_parameters.add("self")
        skipped_fields_and_parameters.add("x_shape")
    
        class_dict = {
            name: param
            for name, param in class_signature.parameters.items()
            if name not in skipped_fields_and_parameters
            and param.kind != inspect.Parameter.VAR_KEYWORD
        }
    
        dataclass_dict = {
            name: param
            for name, param in dataclass_signature.parameters.items()
            if name not in skipped_fields_and_parameters
        }
    
        # Compare if the dataclass and posterior_class have the same argument names
>       assert class_dict.keys() == dataclass_dict.keys(), (
            f"Parameter mismatch:\n"
            f"In class but not dataclass: {class_dict.keys() - dataclass_dict.keys()}\n"
            f"In dataclass but not class: {dataclass_dict.keys() - class_dict.keys()}"
        )
E       AssertionError: Parameter mismatch:
E         In class but not dataclass: {'parameters', 'modules'}
E         In dataclass but not class: {'hidden_features', 'z_score_theta', 'z_score_x', 'num_transforms'}
E       assert dict_keys(['q...', 'modules']) == dict_keys(['q... 'z_score_x'])
E         
E         Full diff:
E         - dict_keys(['q', 'vi_method', 'num_transforms', 'hidden_features', 'z_score_theta', 'z_score_x'])
E         + dict_keys(['q', 'vi_method', 'parameters', 'modules'])

tests/posterior_parameters_test.py:148: AssertionError
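The failing assertion boils down to comparing `inspect.signature` of the parameter dataclass against the target class's `__init__`. A minimal, self-contained sketch of that detection logic, using hypothetical toy classes (`ToyParameters`/`ToyPosterior`, not the actual sbi ones):

```python
import inspect
from dataclasses import dataclass


@dataclass
class ToyParameters:
    q: str = "maf"
    num_transforms: int = 5  # field present in the dataclass only


class ToyPosterior:
    def __init__(self, potential_fn=None, q: str = "maf"):  # no num_transforms
        self.q = q


skipped = {"self", "potential_fn"}

# Dataclass fields, via the generated __init__ signature.
dc_keys = {n for n in inspect.signature(ToyParameters).parameters if n not in skipped}

# Target __init__ parameters, ignoring skipped names and **kwargs.
cls_keys = {
    n
    for n, p in inspect.signature(ToyPosterior.__init__).parameters.items()
    if n not in skipped and p.kind != inspect.Parameter.VAR_KEYWORD
}

# Fields the dataclass has but the class does not accept.
print(sorted(dc_keys - cls_keys))  # → ['num_transforms']
```

This is the same key-set comparison the test performs; here the extra dataclass field `num_transforms` surfaces exactly as in the assertion message above.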
tests/save_and_load_test.py::test_picklability[NRE_B-VIPosteriorParameters]

Flake rate in main: 35.05% (Passed 63 times, Failed 34 times)

Stack Traces | 0.047s run time
inference_method = <class 'sbi.inference.trainers.nre.nre_b.NRE_B'>
posterior_parameters = <class 'sbi.inference.posteriors.posterior_parameters.VIPosteriorParameters'>
tmp_path = PosixPath('.../pytest-0/popen-gw1/test_picklability_NRE_B_VIPost0')
mcmc_params_fast = MCMCPosteriorParameters(method='slice_np_vectorized', thin=1, warmup_steps=1, num_chains=1, init_strategy='resample', init_strategy_parameters=None, num_workers=1, mp_context='spawn')

    @pytest.mark.parametrize(
        "inference_method, posterior_parameters",
        (
            (NPE, DirectPosteriorParameters),
            (NPSE, VectorFieldPosteriorParameters),
            (FMPE, VectorFieldPosteriorParameters),
            pytest.param(NLE, MCMCPosteriorParameters, marks=pytest.mark.mcmc),
            pytest.param(NRE, MCMCPosteriorParameters, marks=pytest.mark.mcmc),
            pytest.param(NRE, VIPosteriorParameters, marks=pytest.mark.mcmc),
            (NRE, RejectionPosteriorParameters),
        ),
    )
    def test_picklability(
        inference_method,
        posterior_parameters,
        tmp_path,
        mcmc_params_fast: MCMCPosteriorParameters,
    ):
        num_dim = 2
        prior = utils.BoxUniform(low=-2 * torch.ones(num_dim), high=2 * torch.ones(num_dim))
        x_o = torch.zeros(1, num_dim)
    
        theta = prior.sample((500,))
        x = theta + 1.0 + torch.randn_like(theta) * 0.1
    
        inference = inference_method(prior=prior)
        _ = inference.append_simulations(theta, x).train(max_num_epochs=1)
        if posterior_parameters is MCMCPosteriorParameters:
            posterior = inference.build_posterior(
                posterior_parameters=mcmc_params_fast
            ).set_default_x(x_o)
        else:
>           posterior = inference.build_posterior(
                posterior_parameters=posterior_parameters()
            ).set_default_x(x_o)

tests/save_and_load_test.py:53: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.../trainers/nre/nre_base.py:360: in build_posterior
    return super().build_posterior(
.../inference/trainers/base.py:507: in build_posterior
    self._posterior = self._create_posterior(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <sbi.inference.trainers.nre.nre_b.NRE_B object at 0x7f8c8cccb1c0>
estimator = RatioEstimator(
  (net): ResidualNet(
    (initial_layer): Linear(in_features=4, out_features=50, bias=True)
    (bloc...Standardize()
    (1): Identity()
  )
  (embedding_net_x): Sequential(
    (0): Standardize()
    (1): Identity()
  )
)
prior = BoxUniform(Uniform(low: torch.Size([2]), high: torch.Size([2])), 1)
sample_with = 'mcmc', device = 'cpu'
posterior_parameters = VIPosteriorParameters(q='maf', vi_method='rKL', num_transforms=5, hidden_features=50, z_score_theta='independent', z_score_x='independent')

    def _create_posterior(
        self,
        estimator: ConditionalEstimator,
        prior: Distribution,
        sample_with: Literal[
            "mcmc", "rejection", "vi", "importance", "direct", "sde", "ode"
        ],
        device: Union[str, torch.device],
        posterior_parameters: PosteriorParameters,
    ) -> NeuralPosterior:
        """
        Create a posterior object using the specified inference method.
    
        Depending on the value of `sample_with`, this method instantiates one of the
        supported posterior inference strategies.
    
        Args:
            estimator: The estimator that the posterior is based on.
            prior: A probability distribution that expresses prior knowledge about the
                parameters, e.g. which ranges are meaningful for them. Must be a PyTorch
                distribution, see FAQ for details on how to use custom distributions.
            sample_with: The inference method to use. Must be one of:
                - "mcmc"
                - "rejection"
                - "vi"
                - "importance"
                - "direct"
                - "sde"
                - "ode"
            device: torch device on which to train the neural net and on which to
                perform all posterior operations, e.g. gpu or cpu.
            posterior_parameters: Configuration passed to the init method for the
                posterior. Must be of type PosteriorParameters.
    
        Returns:
            NeuralPosterior object.
        """
    
        if isinstance(posterior_parameters, DirectPosteriorParameters):
            posterior_estimator = estimator
            if not isinstance(posterior_estimator, ConditionalDensityEstimator):
                raise TypeError(
                    f"Expected posterior_estimator to be an instance of "
                    " ConditionalDensityEstimator, "
                    f"but got {type(posterior_estimator).__name__} instead."
                )
            posterior = DirectPosterior(
                posterior_estimator=posterior_estimator,
                prior=prior,
                device=device,
                **asdict(posterior_parameters),
            )
        elif isinstance(posterior_parameters, VectorFieldPosteriorParameters):
            vector_field_estimator = estimator
            if not isinstance(vector_field_estimator, ConditionalVectorFieldEstimator):
                raise TypeError(
                    f"Expected vector_field_estimator to be an instance of "
                    " ConditionalVectorFieldEstimator, "
                    f"but got {type(vector_field_estimator).__name__} instead."
                )
            if sample_with not in ("ode", "sde"):
                raise ValueError(
                    "`sample_with` must be either",
                    f" 'ode' or 'sde', got '{sample_with}'",
                )
            posterior = VectorFieldPosterior(
                vector_field_estimator=vector_field_estimator,
                prior=prior,
                device=device,
                sample_with=sample_with,
                **asdict(posterior_parameters),
            )
        else:
            # Posteriors requiring potential_fn and theta_transform
            potential_fn, theta_transform = self._get_potential_function(
                prior, estimator
            )
            if isinstance(posterior_parameters, MCMCPosteriorParameters):
                posterior = MCMCPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    proposal=prior,
                    device=device,
                    **asdict(posterior_parameters),
                )
            elif isinstance(posterior_parameters, RejectionPosteriorParameters):
                posterior = RejectionPosterior(
                    potential_fn=potential_fn,
                    proposal=prior,
                    device=device,
                    **asdict(posterior_parameters),
                )
            elif isinstance(posterior_parameters, VIPosteriorParameters):
>               posterior = VIPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    prior=prior,
                    device=device,
                    **asdict(posterior_parameters),
E                   TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms'

.../inference/trainers/base.py:899: TypeError
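The `TypeError` arises because `_create_posterior` forwards every dataclass field via `**asdict(posterior_parameters)`, so any field the target `__init__` does not accept is rejected at call time. A hedged, self-contained reproduction of that mechanism with toy stand-ins (not the real `VIPosterior`):

```python
from dataclasses import asdict, dataclass


@dataclass
class ToyParameters:
    q: str = "maf"
    num_transforms: int = 5  # not accepted by the toy __init__ below


class ToyPosterior:
    def __init__(self, q: str = "maf"):
        self.q = q


try:
    # Mirrors `VIPosterior(**asdict(posterior_parameters))`: asdict() expands
    # *all* fields, including ones the constructor does not know about.
    ToyPosterior(**asdict(ToyParameters()))
except TypeError as err:
    print(err)  # ... got an unexpected keyword argument 'num_transforms'
```

Dropping the extra field from the dataclass, adding it to `__init__`, or listing it in the test's `skipped_fields_and_parameters` would each resolve the mismatch.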
tests/sbc_test.py::test_running_sbc[NLE_A-vi-boxuniform-marginals]

Flake rate in main: 35.05% (Passed 63 times, Failed 34 times)

Stack Traces | 0.09s run time
method = <class 'sbi.inference.trainers.nle.nle_a.NLE_A'>
prior_type = 'boxuniform', reduce_fn_str = 'marginals', sampler = 'vi'
mcmc_params_fast = MCMCPosteriorParameters(method='slice_np_vectorized', thin=1, warmup_steps=1, num_chains=1, init_strategy='resample', init_strategy_parameters=None, num_workers=1, mp_context='spawn')

    @pytest.mark.parametrize("reduce_fn_str", ("marginals", "posterior_log_prob"))
    @pytest.mark.parametrize("prior_type", ("boxuniform", "independent"))
    @pytest.mark.parametrize(
        "method, sampler",
        (
            (NPE, None),
            pytest.param(NLE, "mcmc", marks=pytest.mark.mcmc),
            pytest.param(NLE, "vi", marks=pytest.mark.mcmc),
            (NPSE, None),
        ),
    )
    def test_running_sbc(
        method,
        prior_type: str,
        reduce_fn_str: str,
        sampler: Optional[str],
        mcmc_params_fast: MCMCPosteriorParameters,
    ):
        """Test running inference and then SBC and obtaining nltp with different methods."""
        # Setup
        num_dim = 2
        if prior_type == "boxuniform":
            prior = BoxUniform(-torch.ones(num_dim), torch.ones(num_dim))
        else:
            prior = MultipleIndependent([
                Uniform(-torch.ones(1), torch.ones(1)) for _ in range(num_dim)
            ])
    
        # Test parameters
        num_simulations = 100
        max_num_epochs = 1
        num_sbc_runs = 2
        num_posterior_samples = 20
    
        likelihood_shift = -1.0 * ones(num_dim)
        likelihood_cov = 0.3 * eye(num_dim)
    
        # Helper function to simulate data
        def simulator(theta):
            return linear_gaussian(theta, likelihood_shift, likelihood_cov)
    
        # Build posterior
        posterior_kwargs = {}
        if method == NLE:
            posterior_kwargs = {
                "posterior_parameters": mcmc_params_fast
                if sampler == "mcmc"
                else VIPosteriorParameters()
            }
    
>       posterior = train_inference_method(
            method,
            prior,
            simulator,
            num_simulations=num_simulations,
            max_num_epochs=max_num_epochs,
            **posterior_kwargs,
        )

tests/sbc_test.py:118: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/sbc_test.py:63: in train_inference_method
    posterior = inferer.build_posterior(**kwargs)
.../trainers/nle/nle_base.py:291: in build_posterior
    return super().build_posterior(
.../inference/trainers/base.py:507: in build_posterior
    self._posterior = self._create_posterior(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <sbi.inference.trainers.nle.nle_a.NLE_A object at 0x7f0fdafb2140>
estimator = NFlowsFlow(
  (net): Flow(
    (_transform): CompositeTransform(
      (_transforms): ModuleList(
        (0): Pointwi...ibution): StandardNormal()
    (_embedding_net): Sequential(
      (0): Standardize()
      (1): Identity()
    )
  )
)
prior = BoxUniform(Uniform(low: torch.Size([2]), high: torch.Size([2])), 1)
sample_with = 'mcmc', device = 'cpu'
posterior_parameters = VIPosteriorParameters(q='maf', vi_method='rKL', num_transforms=5, hidden_features=50, z_score_theta='independent', z_score_x='independent')

    [`_create_posterior` frame identical to the test_picklability failure above; listing omitted]
>               posterior = VIPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    prior=prior,
                    device=device,
                    **asdict(posterior_parameters),
E                   TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms'

.../inference/trainers/base.py:899: TypeError
tests/sbc_test.py::test_running_sbc[NLE_A-vi-boxuniform-posterior_log_prob]

Flake rate in main: 35.05% (Passed 63 times, Failed 34 times)

Stack Traces | 0.089s run time
method = <class 'sbi.inference.trainers.nle.nle_a.NLE_A'>
prior_type = 'boxuniform', reduce_fn_str = 'posterior_log_prob', sampler = 'vi'
mcmc_params_fast = MCMCPosteriorParameters(method='slice_np_vectorized', thin=1, warmup_steps=1, num_chains=1, init_strategy='resample', init_strategy_parameters=None, num_workers=1, mp_context='spawn')

    [test source and stack trace identical to test_running_sbc[NLE_A-vi-boxuniform-marginals] above]
E                   TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms'

.../inference/trainers/base.py:899: TypeError
tests/sbc_test.py::test_running_sbc[NLE_A-vi-independent-marginals]

Flake rate in main: 35.05% (Passed 63 times, Failed 34 times)

Stack Traces | 0.085s run time
method = <class 'sbi.inference.trainers.nle.nle_a.NLE_A'>
prior_type = 'independent', reduce_fn_str = 'marginals', sampler = 'vi'
mcmc_params_fast = MCMCPosteriorParameters(method='slice_np_vectorized', thin=1, warmup_steps=1, num_chains=1, init_strategy='resample', init_strategy_parameters=None, num_workers=1, mp_context='spawn')

    [test source identical to test_running_sbc[NLE_A-vi-boxuniform-marginals] above; fails at the same train_inference_method call]

tests/sbc_test.py:118: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/sbc_test.py:63: in train_inference_method
    posterior = inferer.build_posterior(**kwargs)
.../trainers/nle/nle_base.py:291: in build_posterior
    return super().build_posterior(
.../inference/trainers/base.py:507: in build_posterior
    self._posterior = self._create_posterior(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <sbi.inference.trainers.nle.nle_a.NLE_A object at 0x7f6fac20d720>
estimator = NFlowsFlow(
  (net): Flow(
    (_transform): CompositeTransform(
      (_transforms): ModuleList(
        (0): Pointwi...ibution): StandardNormal()
    (_embedding_net): Sequential(
      (0): Standardize()
      (1): Identity()
    )
  )
)
prior = MultipleIndependent(), sample_with = 'mcmc', device = 'cpu'
posterior_parameters = VIPosteriorParameters(q='maf', vi_method='rKL', num_transforms=5, hidden_features=50, z_score_theta='independent', z_score_x='independent')

    def _create_posterior(
        self,
        estimator: ConditionalEstimator,
        prior: Distribution,
        sample_with: Literal[
            "mcmc", "rejection", "vi", "importance", "direct", "sde", "ode"
        ],
        device: Union[str, torch.device],
        posterior_parameters: PosteriorParameters,
    ) -> NeuralPosterior:
        """
        Create a posterior object using the specified inference method.
    
        Depending on the value of `sample_with`, this method instantiates one of the
        supported posterior inference strategies.
    
        Args:
            estimator: The estimator that the posterior is based on.
            prior: A probability distribution that expresses prior knowledge about the
                parameters, e.g. which ranges are meaningful for them. Must be a PyTorch
                distribution, see FAQ for details on how to use custom distributions.
            sample_with: The inference method to use. Must be one of:
                - "mcmc"
                - "rejection"
                - "vi"
                - "importance"
                - "direct"
                - "sde"
                - "ode"
            device: torch device on which to train the neural net and on which to
                perform all posterior operations, e.g. gpu or cpu.
            posterior_parameters: Configuration passed to the init method for the
                posterior. Must be of type PosteriorParameters.
    
        Returns:
            NeuralPosterior object.
        """
    
        if isinstance(posterior_parameters, DirectPosteriorParameters):
            posterior_estimator = estimator
            if not isinstance(posterior_estimator, ConditionalDensityEstimator):
                raise TypeError(
                    f"Expected posterior_estimator to be an instance of "
                    " ConditionalDensityEstimator, "
                    f"but got {type(posterior_estimator).__name__} instead."
                )
            posterior = DirectPosterior(
                posterior_estimator=posterior_estimator,
                prior=prior,
                device=device,
                **asdict(posterior_parameters),
            )
        elif isinstance(posterior_parameters, VectorFieldPosteriorParameters):
            vector_field_estimator = estimator
            if not isinstance(vector_field_estimator, ConditionalVectorFieldEstimator):
                raise TypeError(
                    f"Expected vector_field_estimator to be an instance of "
                    " ConditionalVectorFieldEstimator, "
                    f"but got {type(vector_field_estimator).__name__} instead."
                )
            if sample_with not in ("ode", "sde"):
                raise ValueError(
                    "`sample_with` must be either",
                    f" 'ode' or 'sde', got '{sample_with}'",
                )
            posterior = VectorFieldPosterior(
                vector_field_estimator=vector_field_estimator,
                prior=prior,
                device=device,
                sample_with=sample_with,
                **asdict(posterior_parameters),
            )
        else:
            # Posteriors requiring potential_fn and theta_transform
            potential_fn, theta_transform = self._get_potential_function(
                prior, estimator
            )
            if isinstance(posterior_parameters, MCMCPosteriorParameters):
                posterior = MCMCPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    proposal=prior,
                    device=device,
                    **asdict(posterior_parameters),
                )
            elif isinstance(posterior_parameters, RejectionPosteriorParameters):
                posterior = RejectionPosterior(
                    potential_fn=potential_fn,
                    proposal=prior,
                    device=device,
                    **asdict(posterior_parameters),
                )
            elif isinstance(posterior_parameters, VIPosteriorParameters):
>               posterior = VIPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    prior=prior,
                    device=device,
                    **asdict(posterior_parameters),
E                   TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms'

.../inference/trainers/base.py:899: TypeError
tests/sbc_test.py::test_running_sbc[NLE_A-vi-independent-posterior_log_prob]

Flake rate in main: 35.05% (Passed 63 times, Failed 34 times)

Stack Traces | 0.087s run time
method = <class 'sbi.inference.trainers.nle.nle_a.NLE_A'>
prior_type = 'independent', reduce_fn_str = 'posterior_log_prob', sampler = 'vi'
mcmc_params_fast = MCMCPosteriorParameters(method='slice_np_vectorized', thin=1, warmup_steps=1, num_chains=1, init_strategy='resample', init_strategy_parameters=None, num_workers=1, mp_context='spawn')

    @pytest.mark.parametrize("reduce_fn_str", ("marginals", "posterior_log_prob"))
    @pytest.mark.parametrize("prior_type", ("boxuniform", "independent"))
    @pytest.mark.parametrize(
        "method, sampler",
        (
            (NPE, None),
            pytest.param(NLE, "mcmc", marks=pytest.mark.mcmc),
            pytest.param(NLE, "vi", marks=pytest.mark.mcmc),
            (NPSE, None),
        ),
    )
    def test_running_sbc(
        method,
        prior_type: str,
        reduce_fn_str: str,
        sampler: Optional[str],
        mcmc_params_fast: MCMCPosteriorParameters,
    ):
        """Test running inference and then SBC and obtaining nltp with different methods."""
        # Setup
        num_dim = 2
        if prior_type == "boxuniform":
            prior = BoxUniform(-torch.ones(num_dim), torch.ones(num_dim))
        else:
            prior = MultipleIndependent([
                Uniform(-torch.ones(1), torch.ones(1)) for _ in range(num_dim)
            ])
    
        # Test parameters
        num_simulations = 100
        max_num_epochs = 1
        num_sbc_runs = 2
        num_posterior_samples = 20
    
        likelihood_shift = -1.0 * ones(num_dim)
        likelihood_cov = 0.3 * eye(num_dim)
    
        # Helper function to simulate data
        def simulator(theta):
            return linear_gaussian(theta, likelihood_shift, likelihood_cov)
    
        # Build posterior
        posterior_kwargs = {}
        if method == NLE:
            posterior_kwargs = {
                "posterior_parameters": mcmc_params_fast
                if sampler == "mcmc"
                else VIPosteriorParameters()
            }
    
>       posterior = train_inference_method(
            method,
            prior,
            simulator,
            num_simulations=num_simulations,
            max_num_epochs=max_num_epochs,
            **posterior_kwargs,
        )

tests/sbc_test.py:118: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/sbc_test.py:63: in train_inference_method
    posterior = inferer.build_posterior(**kwargs)
.../trainers/nle/nle_base.py:291: in build_posterior
    return super().build_posterior(
.../inference/trainers/base.py:507: in build_posterior
    self._posterior = self._create_posterior(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <sbi.inference.trainers.nle.nle_a.NLE_A object at 0x7eff5906e980>
estimator = NFlowsFlow(
  (net): Flow(
    (_transform): CompositeTransform(
      (_transforms): ModuleList(
        (0): Pointwi...ibution): StandardNormal()
    (_embedding_net): Sequential(
      (0): Standardize()
      (1): Identity()
    )
  )
)
prior = MultipleIndependent(), sample_with = 'mcmc', device = 'cpu'
posterior_parameters = VIPosteriorParameters(q='maf', vi_method='rKL', num_transforms=5, hidden_features=50, z_score_theta='independent', z_score_x='independent')

    [ `_create_posterior` source identical to the traceback above; elided to avoid duplication ]
>               posterior = VIPosterior(
                    potential_fn=potential_fn,
                    theta_transform=theta_transform,
                    prior=prior,
                    device=device,
                    **asdict(posterior_parameters),
E                   TypeError: VIPosterior.__init__() got an unexpected keyword argument 'num_transforms'

.../inference/trainers/base.py:899: TypeError
tests/torchutils_test.py::TorchUtilsTest::test_searchsorted

Flake rate in main: 48.45% (Passed 50 times, Failed 47 times)

Stack Traces | 0.004s run time
.venv/lib/python3.10....../site-packages/xdist/remote.py:289: in pytest_runtest_logreport
    self.sendevent("testreport", data=data)
.venv/lib/python3.10....../site-packages/xdist/remote.py:126: in sendevent
    self.channel.send((name, kwargs))
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:912: in send
    self.gateway._send(Message.CHANNEL_DATA, self.id, dumps_internal(item))
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1629: in dumps_internal
    return _Serializer().save(obj)  # type: ignore[return-value]
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1647: in save
    self._save(obj)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1667: in _save
    dispatch(self, obj)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1744: in save_tuple
    self._save(item)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1667: in _save
    dispatch(self, obj)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1740: in save_dict
    self._write_setitem(key, value)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1734: in _write_setitem
    self._save(value)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1667: in _save
    dispatch(self, obj)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1740: in save_dict
    self._write_setitem(key, value)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1734: in _write_setitem
    self._save(value)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1667: in _save
    dispatch(self, obj)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1740: in save_dict
    self._write_setitem(key, value)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1734: in _write_setitem
    self._save(value)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1667: in _save
    dispatch(self, obj)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1740: in save_dict
    self._write_setitem(key, value)
.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1734: in _write_setitem
    self._save(value)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <execnet.gateway_base._Serializer object at 0x7f210c584e50>
obj = tensor([0.0000, 0.1111, 0.2222, 0.3333, 0.4444, 0.5556, 0.6667, 0.7778, 0.8889])

    def _save(self, obj: object) -> None:
        tp = type(obj)
        try:
            dispatch = self._dispatch[tp]
        except KeyError:
            methodname = "save_" + tp.__name__
            meth: Callable[[_Serializer, object], None] | None = getattr(
                self.__class__, methodname, None
            )
            if meth is None:
>               raise DumpError(f"can't serialize {tp}") from None
E               execnet.gateway_base.DumpError: can't serialize <class 'torch.Tensor'>

.venv/lib/python3.10....................................................../site-packages/execnet/gateway_base.py:1665: DumpError


Contributor

@manuelgloeckler manuelgloeckler left a comment


Hey Jan,

thanks for implementing this. I left a few comments on the tests and __init__ changes.

I wonder if it's reasonably easy to connect the old VIPosterior parts with the new AmortizedVIPosterior a bit better. But this would require quite a few changes to the vi parts, I think, so I am also happy to do the amortized VI separately.

janfb and others added 12 commits February 2, 2026 19:56
Add `build_zuko_vi_flow` function that creates unconditional Zuko
normalizing flows for variational inference training. Supports:
- NSF (Neural Spline Flow)
- MAF (Masked Autoregressive Flow)
- Gaussian (full covariance)
- Gaussian diagonal

Also includes helper `_build_zuko_gaussian_flow` with custom affine
transforms (diagonal and lower triangular) for Gaussian variants.

This addresses Phase 1, Step 1.1 of the VI unification plan.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
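The Gaussian and diagonal-Gaussian variants described in this commit can be sketched in plain PyTorch. This is an illustrative stand-in, not the PR's actual helper: `build_gaussian_q` and its structure are hypothetical names, assuming only `torch.distributions`:

```python
import torch
from torch.distributions import Independent, MultivariateNormal, Normal


def build_gaussian_q(dim: int, diagonal: bool = True):
    """Hypothetical sketch of an unconditional Gaussian variational
    distribution q(theta) with learnable parameters, roughly what the
    'gaussian' / 'gaussian_diag' variants provide."""
    loc = torch.zeros(dim, requires_grad=True)
    if diagonal:
        # Diagonal Gaussian: per-dimension scale, parameterized in log space.
        log_scale = torch.zeros(dim, requires_grad=True)

        def q():
            return Independent(Normal(loc, log_scale.exp()), 1)

        params = [loc, log_scale]
    else:
        # Full covariance via an unconstrained lower-triangular factor.
        tril = torch.eye(dim).clone().requires_grad_(True)

        def q():
            scale_tril = tril.tril()
            # Enforce a strictly positive diagonal via softplus.
            pos_diag = torch.nn.functional.softplus(scale_tril.diagonal())
            scale_tril = (
                scale_tril - scale_tril.diagonal().diag() + pos_diag.diag()
            )
            return MultivariateNormal(loc, scale_tril=scale_tril)

        params = [loc, tril]
    return q, params


q, params = build_gaussian_q(dim=1, diagonal=True)
theta = q().rsample((5,))    # reparameterized samples, shape (5, 1)
log_q = q().log_prob(theta)  # shape (5,)
```

Because both variants return a `torch.distributions` object with `rsample` and `log_prob`, they plug into gradient-based ELBO optimization without a pyro dependency.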
Update DivergenceOptimizer to handle both Pyro TransformedDistribution
and ZukoUnconditionalFlow variational distributions:

- Add VariationalDistribution type alias for Union of flow types
- Detect flow type via isinstance check for ZukoUnconditionalFlow
- Handle Pyro-specific set_default_validate_args conditionally
- Properly register Zuko nn.Module flows in ModuleList

Part of VI unification (Phase 2): adapting existing VI infrastructure
to work with new Zuko-based flows alongside legacy Pyro flows.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Phase 3 of VI unification: Extend Zuko flow support to all divergence
optimizer subclasses (ElboOptimizer, ForwardKLOptimizer, RenyiDivergenceOptimizer).

Changes:
- Update warmup() to support Zuko flows using sample_and_log_prob
- Add clear error for unsupported 'identity' warmup with Zuko flows
- Fix missing 'raise' in NotImplementedError for invalid warmup methods
- Update _loss() methods to check _is_zuko flag for reparameterized sampling
- Refactor elbo_particles() to handle both Zuko and Pyro flow APIs
- Fix typo: 'inital_target' -> 'initial_target'

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Fix duplicate __all__ exports in sbi/inference/__init__.py
- Remove incorrect ZukoFlowType export from posteriors/__init__.py
- Add thread-safety lock to AmortizedVIPosterior.set_x()
- Fix missing comma in VIPosterior progress bar display

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Phase 3 Steps 3.1-3.3 of VI unification:
- Add _mode tracking attribute for single_x vs amortized modes
- Add _build_zuko_flow helper method for building Zuko flows
- Update set_q to use Zuko for maf/nsf/mcf/scf flow types
- Keep Pyro flows for gaussian/gaussian_diag for backwards compat
- Add ZukoUnconditionalFlow validation in set_q
- Fix undefined transforms bug in build_zuko_unconditional_flow

The train(x_o) signature remains unchanged. DivergenceOptimizer
(adapted in Phase 2) handles both Pyro and Zuko flows.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
…rence

Add amortized VI support to VIPosterior:
- New train_amortized(theta, x) method for training conditional flows q(θ|x)
- Updated sample() and log_prob() to handle both single-x and amortized modes
- Added _build_conditional_flow() helper using Zuko conditional flows
- Mode tracking with warnings when switching between modes
- Thread-safety lock for potential_fn state during ELBO computation
- Validation-based early stopping for amortized training

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Fix deepcopy/pickle of VIPosterior with _set_x_lock attribute:
  - __deepcopy__: Create new Lock instead of attempting deepcopy
  - __getstate__: Pop _set_x_lock from state dict (not picklable)
  - __setstate__: Restore lock immediately after restoring __dict__
- Update tests for Zuko flow compatibility:
  - Use sample() instead of rsample() for Zuko flows (which don't have rsample)
  - Skip .support attribute check for Zuko flows (they don't expose it)
- Fix typo in docstring: "due not support" -> "do not support"

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
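The deepcopy/pickle handling described in this commit follows a standard Python pattern for objects holding a `threading.Lock`. A minimal stdlib-only sketch (the class and attribute names here are illustrative, mirroring the `_set_x_lock` attribute mentioned above):

```python
import copy
import pickle
import threading


class LockedState:
    """Hypothetical sketch: an object holding a threading.Lock that must
    survive both deepcopy and pickling."""

    def __init__(self):
        self._set_x_lock = threading.Lock()
        self.value = 42

    def __deepcopy__(self, memo):
        # Locks cannot be deep-copied; give the copy a fresh Lock instead.
        cls = self.__class__
        new = cls.__new__(cls)
        memo[id(self)] = new
        for key, val in self.__dict__.items():
            if key == "_set_x_lock":
                setattr(new, key, threading.Lock())
            else:
                setattr(new, key, copy.deepcopy(val, memo))
        return new

    def __getstate__(self):
        # Locks are not picklable; pop the lock from the state dict.
        state = self.__dict__.copy()
        state.pop("_set_x_lock", None)
        return state

    def __setstate__(self, state):
        # Restore attributes, then recreate the lock immediately.
        self.__dict__.update(state)
        self._set_x_lock = threading.Lock()


obj = LockedState()
clone = copy.deepcopy(obj)                      # gets its own fresh Lock
restored = pickle.loads(pickle.dumps(obj))      # lock recreated on unpickle
```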
- Update VIPosterior.sample() to properly handle batched x in amortized mode
  - Preserve batch dimension for multi-observation inputs
  - Squeeze singleton batch dimension to match base posterior behavior
  - Support default_x via _x_else_default_x()
- Implement sample_batched() for amortized mode (delegates to sample())
- Improve log_prob() docstring with batched x documentation
- Standardize error types: use ValueError instead of AttributeError
- Migrate all tests from AmortizedVIPosterior to VIPosterior.train_amortized()
  - Update imports and constructor calls
  - Change .train() to .train_amortized() with flow params
  - Update assertions for new mode attribute

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Remove AmortizedVIPosterior from __all__ exports in both sbi/inference/__init__.py
and sbi/inference/posteriors/__init__.py. All functionality has been migrated to
the unified VIPosterior class with its train_amortized() method.

Note: The amortized_vi_posterior.py file still exists pending manual deletion
approval. The vi_pyro_flows.py file is retained as it's still needed for
gaussian/gaussian_diag flow types.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@janfb
Copy link
Contributor Author

janfb commented Feb 6, 2026

Thanks for the review @manuelgloeckler! As discussed offline, I made major updates to this PR to combine both single-x VI and amortized VI into one posterior class.

AI Usage: As I am experimenting with different AI coding strategies at the moment, I tried setting up a detailed "product requirement file" summarizing our goal and then set up Claude Code in a loop (similar to the RALPH setting) that runs overnight and works on one step at a time, does a self-review, writes the progress into a file, and then starts from scratch again with fresh context. Therefore, many commits are co-authored by Claude.

It worked reasonably well, but not perfectly. Claude failed to fix an issue zuko has in 1D cases: it resorts to an element-wise transform and performs much worse than pyro. My intention was to completely remove the pyro flows to have a unified framework for VI. To this end, I ended up implementing a Gaussian and a diag-Gaussian density estimator in plain PyTorch (not pyro) for the 1D case. Please let me know if you prefer a different approach.

Summary

Unified API

Instead of a separate AmortizedVIPosterior class, we now have a single VIPosterior with two training modes:

  • train() - Single-x VI: trains unconditional q(θ) for a fixed observation
  • train_amortized(theta, x) - Amortized VI: trains conditional q(θ|x) across observations
  • both modes use zuko as the flow backend
  # Single-x mode (unchanged)
  posterior = VIPosterior(potential_fn, prior)
  posterior.train()
  samples = posterior.sample((1000,))

  # Amortized mode
  posterior = VIPosterior(potential_fn, prior)
  posterior.train_amortized(theta, x, flow_type=ZukoFlowType.NSF)
  samples = posterior.sample((1000,), x=x_new)  # works for any x

I agree to your points on the tests:

  • Merged test_amortized_vi_training into test_amortized_vi_accuracy
  • Removed redundant comparison test (test_amortized_vs_single_x_vi and gradient flow test)
  • Added fast test (test_amortized_vi_with_fake_potential) for CI coverage
  • Added NRE support via parametrized fixture - accuracy test now runs for both NLE and NRE

@janfb janfb force-pushed the add-amortized-vip branch from 617d5a4 to 8b48387 Compare February 6, 2026 15:43
janfb and others added 2 commits February 6, 2026 21:21
Resolved conflicts:
- vi_posterior.py: Combined docstrings for sample() Args
- vi_test.py: Kept sampling_method parameterization for comprehensive testing
- vi_test.py: Removed duplicate assertion for K parameter

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@janfb janfb requested a review from manuelgloeckler February 6, 2026 17:35

Development

Successfully merging this pull request may close these issues.

Amortized Variational Inference

2 participants